On the capacity per synapse

Authors

  • Ido Kanter
  • Eli Eisenstein
Abstract

The optimal storage capacity of a perceptron with a finite fraction of sign-constrained weights, which are prescribed a priori, is examined. The storage capacity is calculated by considering the fractional volume of weights which can store a set of $\alpha N$ random patterns, where $N$ is the size of the input. It is found that in the case where $(1-s)N$ weights are sign constrained the capacity is $\frac{1}{2}(1+s)\,\alpha_p(\kappa)$, where $\alpha_p(\kappa)$ is the maximal storage capacity in Gardner's case and $\kappa$ is a stability parameter.

Numerous analyses of neural networks using statistical-mechanics tools have been published in the last few years. Analogies between such systems and random magnetic spin systems have been studied extensively [1]. Recently, a new class of networks, the so-called multilayer networks, has attracted much attention. Multilayer networks are composed of neuron-like binary units interacting in a feedforward fashion, i.e. no loops are allowed in the connectivity graph. The output of the network is easily computed by propagating the signals from the input units to the output units; in this process each element is updated according to some transfer function of its input from the previous layer. The prototype of this class of architectures is the perceptron [2], which consists of $N$ binary input elements and one binary output element.

We are interested in the embedding of $p = \alpha N$ relations between pairs of input/output. More precisely, the input of the network is a set of $p$ random patterns $\xi_i^\mu = \pm 1$, $i = 1, \ldots, N$, $\mu = 1, \ldots, p$, and a set of $p$ random binary outputs $y^\mu = \pm 1$, $\mu = 1, \ldots, p$. For such a network the learning process is to modify the synaptic weights $\{J_i\}$ in such a way that

$$y^\mu \frac{1}{\sqrt{N}} \sum_{j=1}^{N} J_j \xi_j^\mu \ge \kappa \qquad \mu = 1, \ldots, p. \tag{1}$$

The stability parameter $\kappa$ is there to ensure robustness to errors in the input, or to enlarge the basins of attraction in the case of fully connected networks, and was first introduced by Gardner [3]. In the case $\kappa > 0$ its value is meaningful only when one specifies the normalization of the weights $\{J_i\}$ [3,4]. One commonly used normalization is the spherical normalization

$$\sum_{j=1}^{N} J_j^2 = N \tag{2}$$

which is a global constraint.

A significant contribution to the perceptron problem was made by Gardner, who showed that the probability of existence of solutions can be deduced from the fractional volume of the parameters $\{J_i\}$ which obey constraints (1) and (2) [3]. Recent work has applied this method to various cases, and here we would like to emphasize two lines of generalization. In the first approach, the global constraint is replaced by local constraints on each individual weight. This class of problems contains, for instance, the perceptron in the Ising limit [4,5] or in the limit of discrete synaptic weights [6]. In the second approach, local constraints are added in addition to the spherical constraint. An example belonging to this class is a network with sign-constrained synapses [7], in which the signs of all the weights are prescribed a priori. The study of neural networks with local constraints on the weight strengths is motivated from both the biological and the applications points of view.

In this work we concentrate on the second approach, but allow the local constraints to differ from one weight to another. More precisely, our perceptron consists of $sN$ unconstrained weights

$$J_i \in (-\infty, \infty) \qquad i = 1, \ldots, sN \tag{3}$$

and $(1-s)N$ sign-constrained weights

$$J_i \in (0, \infty) \qquad i = sN+1, \ldots, N. \tag{4}$$
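For concreteness, the setup of equations (1)-(4) is easy to state in code. The following Python sketch (our own illustration, not from the letter; the values of $N$, $s$, $\alpha$ and $\kappa$ are arbitrary choices) draws random patterns and outputs, builds a random weight vector obeying the sign constraints (4) and the spherical normalization (2), and measures how many patterns satisfy the stability condition (1). A random $J$ satisfies (1) only for a fraction of the patterns; the storage problem asks for a $J$ that satisfies all of them.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (not from the letter): N inputs, p = alpha*N patterns,
# a fraction s of unconstrained weights, stability parameter kappa.
N, s, alpha, kappa = 200, 0.5, 0.5, 0.0
p = int(alpha * N)
n_free = int(s * N)                          # J_i in (-inf, inf), eq. (3)

xi = rng.choice([-1.0, 1.0], size=(p, N))    # random patterns xi_i^mu
y = rng.choice([-1.0, 1.0], size=p)          # random outputs y^mu

# A candidate weight vector: Gaussian entries, with the last (1-s)N weights
# forced positive (eq. (4)), then rescaled onto the sphere (eq. (2)).
J = rng.normal(size=N)
J[n_free:] = np.abs(J[n_free:])
J *= np.sqrt(N / np.sum(J ** 2))

# Stabilities Delta^mu = y^mu (1/sqrt(N)) sum_j J_j xi_j^mu; condition (1)
# requires Delta^mu >= kappa for every mu.
delta = y * (xi @ J) / np.sqrt(N)
print("fraction of patterns satisfying (1):", np.mean(delta >= kappa))
```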
The limit $s = 1$ is Gardner's case, where every point on the sphere (2) is available as a solution. The opposite limit, $s = 0$, is the case of sign-constrained weights, where only a $2^{-N}$ fraction of the sphere, a single connected region, is available as a solution. Following Gardner, the relative volume in weight space for the network defined by (3) and (4) is given by

$$V = \frac{1}{C} \int_{-\infty}^{\infty} \prod_{i=1}^{sN} \mathrm{d}J_i \int_{0}^{\infty} \prod_{i=sN+1}^{N} \mathrm{d}J_i \; \prod_{\mu=1}^{p} \theta\!\left( y^\mu \frac{1}{\sqrt{N}} \sum_{j=1}^{N} J_j \xi_j^\mu - \kappa \right) \delta\!\left( \sum_{j=1}^{N} J_j^2 - N \right) \tag{5}$$

where $C$ is a normalization constant and the theta and delta functions stand for constraints (1) and (2) respectively. The computation proceeds as in [3] and [4]. One concentrates on the computation of $\ln V$, which is a quantity of order $N$. This quantity is averaged over the quenched distribution of the random input patterns $\{\xi_i^\mu\}$, in the expectation that the fluctuations of $\ln V$ from sample to sample are negligible. Using the replica method one should calculate the average

$$\langle\!\langle \ln V \rangle\!\rangle = \lim_{n \to 0} \frac{\langle\!\langle V^n \rangle\!\rangle - 1}{n} \tag{6}$$

where $\langle\!\langle \cdots \rangle\!\rangle$ stands for the average over the random inputs $\{\xi_i^\mu\}$. In the thermodynamic limit and within the replica-symmetric ansatz, (6) can be expressed in terms of three order parameters $E$, $M$ and $q$ in the following form.
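Although the replica computation itself is analytic, its headline prediction can be probed numerically. At $\kappa = 0$ condition (1) is scale invariant, so the spherical constraint (2) can be dropped and the storability of a given sample becomes a linear-feasibility problem. The result quoted in the abstract, $\alpha_c = \frac{1}{2}(1+s)\,\alpha_p(\kappa)$ with Gardner's $\alpha_p(0) = 2$, then predicts a transition at $\alpha = 1 + s$. The sketch below (our own check, not from the letter; sizes and sample counts are illustrative) tests feasibility with an LP solver and sweeps $\alpha$ for $s = 0.5$, where the predicted threshold is $\alpha_c = 1.5$.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(1)

def storable(N, p, s, rng):
    """Can p random patterns be stored (kappa = 0) by a perceptron with
    s*N free and (1-s)*N non-negative weights?  At kappa = 0 condition (1)
    is scale invariant, so we ask for y^mu * (xi^mu . J) >= 1 instead."""
    xi = rng.choice([-1.0, 1.0], size=(p, N))
    y = rng.choice([-1.0, 1.0], size=p)
    n_free = int(s * N)
    A_ub = -(y[:, None] * xi)        # -(y^mu xi^mu) . J <= -1 for all mu
    b_ub = -np.ones(p)
    bounds = [(None, None)] * n_free + [(0, None)] * (N - n_free)
    res = linprog(c=np.zeros(N), A_ub=A_ub, b_ub=b_ub, bounds=bounds,
                  method="highs")
    return res.status == 0           # 0 = feasible, 2 = infeasible

N, s = 100, 0.5
for alpha in (1.0, 1.25, 1.5, 1.75, 2.0):
    p = int(alpha * N)
    frac = np.mean([storable(N, p, s, rng) for _ in range(20)])
    print(f"alpha = {alpha:.2f}: fraction of storable samples = {frac:.2f}")
# As N grows, the fraction should drop from 1 to 0 near alpha = 1 + s = 1.5.
```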
